Kernel center adaptation in the reproducing kernel Hilbert space embedding method

Authors

Abstract

The performance of adaptive estimators that employ embedding in reproducing kernel Hilbert spaces (RKHS) depends on the choice of the locations of the basis centers: parameter convergence and approximation error rates depend on where and how the centers are distributed over the state space. In this article, we develop a theory that relates parameter convergence to center placement, and we derive criteria for choosing centers for a specific class of systems by exploiting the fact that the state trajectory regularly visits a neighborhood of its positive limit set. Two algorithms, based on centroidal Voronoi tessellations and on Kohonen self-organizing maps, are derived for choosing centers in the RKHS embedding method. Finally, we implement these methods in two practical examples and test their effectiveness.
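A centroidal Voronoi tessellation of a sampled region can be computed with Lloyd's algorithm, which alternates nearest-center assignment with centroid updates. The sketch below is illustrative only and is not the paper's algorithm: it assumes hypothetical trajectory samples clustered near a limit cycle (a noisy unit circle), mimicking a state trajectory that repeatedly visits its positive limit set, and places kernel centers on those samples.

```python
import numpy as np

def lloyd_centers(samples, k, iters=50, seed=0):
    """Place k centers on the samples via Lloyd's algorithm, which
    converges toward a centroidal Voronoi tessellation of the data."""
    rng = np.random.default_rng(seed)
    centers = samples[rng.choice(len(samples), k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest center (its Voronoi cell).
        d = np.linalg.norm(samples[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the centroid of its cell.
        for j in range(k):
            cell = samples[labels == j]
            if len(cell):
                centers[j] = cell.mean(axis=0)
    return centers

# Hypothetical samples near a limit cycle: unit circle plus noise.
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
pts = np.c_[np.cos(t), np.sin(t)]
pts += 0.05 * np.random.default_rng(1).normal(size=(400, 2))
C = lloyd_centers(pts, k=8)
```

Because the samples concentrate on the circle, the resulting centers end up spread around it rather than wasted on regions the trajectory never visits, which is the intuition behind trajectory-adapted center placement.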


Similar articles

Reproducing Kernel Hilbert Space Method for Solving Generalized Burgers Equation

In this paper, we present a new method based on Reproducing Kernel Space (RKS) theory, and an iterative algorithm for solving the Generalized Burgers Equation (GBE) is presented. The analytical solution is given as a series in an RKS, and the approximate solution u(x,t) is constructed by truncating the series. The convergence of u(x,t) to the analytical solution is also proved.


Path Integral Control by Reproducing Kernel Hilbert Space Embedding

We present an embedding of stochastic optimal control problems, of the so called path integral form, into reproducing kernel Hilbert spaces. Using consistent, sample based estimates of the embedding leads to a model-free, non-parametric approach for calculation of an approximate solution to the control problem. This formulation admits a decomposition of the problem into an invariant and task de...


Step Size Adaptation in Reproducing Kernel Hilbert Space

This paper presents an online support vector machine (SVM) that uses the stochastic meta-descent (SMD) algorithm to adapt its step size automatically. We formulate the online learning problem as a stochastic gradient descent in reproducing kernel Hilbert space (RKHS) and translate SMD to the nonparametric setting, where its gradient trace parameter is no longer a coefficient vector but an eleme...
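Stochastic gradient descent in an RKHS can be sketched with a plain kernelized update: for squared loss, the functional gradient at a sample is a scaled kernel section, so each step appends a new expansion term. This is a minimal illustration of that formulation only, with a fixed step size; adapting the step size per update is precisely the SMD contribution described above and is not implemented here.

```python
import numpy as np

def rbf(x, y, gamma=10.0):
    """Gaussian RBF kernel."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

class OnlineKernelRegressor:
    """Stochastic gradient descent in an RKHS with squared loss.
    The estimate is a growing kernel expansion; `eta` is a fixed
    step size (an SMD-style method would adapt it online)."""
    def __init__(self, eta=0.5, gamma=10.0):
        self.eta, self.gamma = eta, gamma
        self.centers, self.coefs = [], []

    def predict(self, x):
        return sum(a * rbf(x, c, self.gamma)
                   for a, c in zip(self.coefs, self.centers))

    def update(self, x, y):
        # The RKHS gradient of 0.5*(f(x)-y)^2 is (f(x)-y)*k(x, .),
        # so the descent step adds x as a new expansion center.
        err = self.predict(x) - y
        self.centers.append(x)
        self.coefs.append(-self.eta * err)

# Fit a hypothetical 1-D target, f(x) = sin(3x), from a sample stream.
model = OnlineKernelRegressor(eta=0.5, gamma=10.0)
rng = np.random.default_rng(0)
for _ in range(200):
    x = rng.uniform(-1, 1, size=1)
    model.update(x, np.sin(3 * x[0]))
```

Note that the expansion grows by one term per sample; practical online kernel methods prune or merge terms to bound memory.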


Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space

A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent variables (components). However, in contrast to PCR, PLS creates the components by modeling th...


Subspace Regression in Reproducing Kernel Hilbert Space

We focus on three methods for finding a suitable subspace for regression in a reproducing kernel Hilbert space: kernel principal component analysis, kernel partial least squares, and kernel canonical correlation analysis, and we demonstrate how this fits within a more general context of subspace regression. For the kernel partial least squares case a least squares support vector machine style der...



Journal

Journal title: International Journal of Adaptive Control and Signal Processing

Year: 2022

ISSN: 0890-6327, 1099-1115

DOI: https://doi.org/10.1002/acs.3407